
# Mixture of Experts (MoE) Architecture

## TinyMistral 6x248M Instruct

License: Apache-2.0

A language model built on a Mixture of Experts (MoE) architecture: multiple expert models are fused through the LazyMergekit framework, and the merged model is fine-tuned for instruction-following tasks, where it performs well (see the loading sketch below).

Tags: Large Language Model, Transformers, English
Author: M4-ai
Downloads: 1,932
Likes: 9
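
As a rough illustration of how such a model might be used, the following is a minimal sketch that loads it through the Hugging Face Transformers library. The repository id `M4-ai/TinyMistral-6x248M-Instruct` is an assumption inferred from the author and model name listed above, not confirmed by this page; verify it before running.

```python
# Minimal sketch: load and query the model with Hugging Face Transformers.
# The repo id below is assumed from the listing (author "M4-ai", model
# "TinyMistral 6x248M Instruct") and may differ from the actual repository.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "M4-ai/TinyMistral-6x248M-Instruct"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Explain what a Mixture of Experts model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short completion; the sampling parameters are illustrative only.
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```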